
IDEA: An Invariant Perspective for Efficient Domain Adaptive Image Retrieval

Neural Information Processing Systems

More importantly, we employ a generative model to synthesize samples that simulate interventions on various non-causal effects, thereby minimizing their impact on hash codes and promoting domain invariance. Comprehensive experiments on benchmark datasets confirm the superior performance of our proposed IDEA over a variety of competitive baselines.


Domain Re-Modulation for Few-Shot Generative Domain Adaptation

Yi Wu, Ziqiang Li (University of Science and Technology of China); Chaoyue Wang, Heliang Zheng, Shanshan Zhao (JD Explore Academy); Bin Li

Neural Information Processing Systems

In this study, we delve into the task of few-shot Generative Domain Adaptation (GDA), which involves transferring a pre-trained generator from one domain to a new domain using only a few reference images. Inspired by the way human brains acquire knowledge in new domains, we present an innovative generator structure called Domain Re-Modulation (DoRM).




DropPos: Pre-Training Vision Transformers by Reconstructing Dropped Positions

Neural Information Processing Systems

To answer this question, we begin by revisiting the forward procedure of ViTs. A sequence of positional embeddings (PEs) [51] is added to patch embeddings to preserve position information. Intuitively, simply discarding these PEs and requesting the model to reconstruct the position for each patch naturally becomes a qualified location-aware pretext task.
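The pretext task described here (drop positional embeddings, then classify each patch's original position) can be sketched in a few lines of NumPy. The toy sizes, the drop ratio, and the random linear head are illustrative stand-ins for an actual ViT and its position-prediction head, not the paper's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

num_patches, dim = 16, 8                          # toy sizes (assumed)
patches = rng.normal(size=(num_patches, dim))     # patch embeddings
pos_embed = rng.normal(size=(num_patches, dim))   # positional embeddings (PEs)

# DropPos-style setup: remove the PEs for a random subset of patches and
# ask the model to reconstruct each dropped patch's position.
drop_ratio = 0.75
num_drop = int(num_patches * drop_ratio)
dropped = rng.choice(num_patches, size=num_drop, replace=False)

tokens = patches + pos_embed
tokens[dropped] = patches[dropped]   # position information discarded here

# Targets for position reconstruction: the original patch indices.
targets = dropped

# A random, untrained linear head stands in for the ViT encoder + classifier.
W = rng.normal(size=(dim, num_patches))
logits = tokens[dropped] @ W         # (num_drop, num_patches) position scores

# Cross-entropy over the num_patches position classes.
logp = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
loss = -logp[np.arange(num_drop), targets].mean()
print(float(loss))
```

Framing position reconstruction as per-patch classification over `num_patches` classes is what makes the task self-supervised: the labels are just the indices of the dropped patches, so no annotation is needed.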